On steepest descent curves for quasi convex families in R^n


Similar resources

On Steepest Descent Algorithms for Discrete Convex Functions

This paper investigates the complexity of steepest descent algorithms for two classes of discrete convex functions, M-convex functions and L-convex functions. Simple tie-breaking rules yield complexity bounds that are polynomials in the dimension of the variables and the size of the effective domain. A combination of the present results with a standard scaling approach leads to an efficient algor...
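
As a rough illustration of the kind of method analyzed there, the following is a minimal sketch of discrete steepest descent as plain local search over unit coordinate moves, with a fixed scan order as tie-breaking rule. The neighborhoods actually used for M-convex and L-convex functions, and the paper's specific tie-breaking rules and complexity analysis, are different and not reproduced here; all names below are illustrative.

```python
import numpy as np

def discrete_steepest_descent(f, x0, max_iter=10_000):
    # Generic discrete steepest descent: move to the best neighbor among
    # unit coordinate steps, breaking ties by a fixed scan order.
    # (M-/L-convex functions use different neighborhoods, e.g. e_i - e_j
    # exchanges for M-convex functions; this is only a schematic sketch.)
    x = np.asarray(x0, dtype=int)
    n = len(x)
    for _ in range(max_iter):
        moves = [s * np.eye(n, dtype=int)[i] for i in range(n) for s in (1, -1)]
        best = min(moves, key=lambda d: f(x + d))
        if f(x + best) >= f(x):
            return x  # no improving neighbor: stop
        x = x + best
    return x

# Example: a separable convex function on Z^2 with minimizer (2, -1).
f = lambda z: (z[0] - 2) ** 2 + (z[1] + 1) ** 2
print(discrete_steepest_descent(f, x0=[0, 0]))  # -> [ 2 -1]
```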


Steepest descent method on a Riemannian manifold: the convex case

In this paper we are interested in the asymptotic behavior of the trajectories of the famous steepest descent evolution equation on Riemannian manifolds. It reads ẋ(t) + grad φ(x(t)) = 0. It is shown how the convexity of the objective function φ helps in establishing the convergence, as time goes to infinity, of the trajectories towards points that minimize φ. Some numerical illustrations are ...
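
For intuition, here is a minimal numerical sketch of that evolution equation in the flat Euclidean special case, where following ẋ(t) = -grad φ(x(t)) with explicit Euler steps reduces to plain gradient descent. The function names and step size are illustrative, and none of the Riemannian machinery of the paper appears here.

```python
import numpy as np

def gradient_flow_euler(grad_phi, x0, step=1e-2, n_steps=5000):
    # Explicit Euler discretization of x'(t) = -grad phi(x(t)),
    # i.e. x_{k+1} = x_k - h * grad phi(x_k), in the Euclidean case.
    x = np.asarray(x0, dtype=float)
    for _ in range(n_steps):
        x = x - step * grad_phi(x)
    return x

# Example: phi(x) = 0.5 * ||x - c||^2 has grad phi(x) = x - c, and the
# trajectory converges to the minimizer c as t grows.
c = np.array([1.0, -2.0])
print(gradient_flow_euler(lambda x: x - c, x0=np.zeros(2)))  # ~ [ 1. -2.]
```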


A hybrid steepest descent method for constrained convex optimization

This paper describes a hybrid steepest descent method to decrease over time any given convex cost function while keeping the optimization variables within any given convex set. The method takes advantage of properties of hybrid systems to avoid the computation of projections or of a dual optimum. The convergence to a global optimum is analyzed using Lyapunov stability arguments. A discretized imp...
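
To make the problem setting concrete, here is a minimal sketch of the projection-based baseline (projected gradient descent) that such constrained descent schemes are usually compared against. The hybrid method of the paper is designed precisely to avoid computing these projections, and its Lyapunov-based analysis is not reflected in this sketch; the names below are illustrative.

```python
import numpy as np

def projected_gradient(grad_f, project, x0, step=1e-2, n_steps=2000):
    # Projection-based baseline for "decrease f while staying in a convex
    # set C": take a gradient step, then project back onto C.
    x = project(np.asarray(x0, dtype=float))
    for _ in range(n_steps):
        x = project(x - step * grad_f(x))
    return x

# Example: minimize 0.5 * ||x - c||^2 over the unit ball, whose projection
# is a simple rescaling; the minimizer is on the boundary, at ~[1, 0].
c = np.array([3.0, 0.0])
proj_ball = lambda x: x / max(1.0, np.linalg.norm(x))
print(projected_gradient(lambda x: x - c, proj_ball, x0=np.zeros(2)))
```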


Continuous Steepest Descent Path for Traversing Non-convex Regions

This paper revisits the ideas of seeking unconstrained minima by following a continuous steepest descent path (CSDP). We are especially interested in the merits of such an approach in regions where the objective function is non-convex and Newton-like methods become ineffective. The paper combines ODE-trajectory following with trust-region ideas to give an algorithm which performs curvilinear se...
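
A minimal sketch of the underlying idea, continuous steepest descent path following, is below: it simply integrates ẋ(t) = -∇f(x(t)) with an off-the-shelf adaptive ODE solver on the (non-convex) Rosenbrock function. The trust-region safeguards and curvilinear searches that constitute the paper's actual contribution are not shown, and the tolerances chosen here are illustrative.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Rosenbrock function: a standard non-convex test problem whose minimizer
# sits at (1, 1) at the end of a long curved valley.
def rosen_grad(x):
    x1, x2 = x
    return np.array([-400.0 * x1 * (x2 - x1**2) - 2.0 * (1.0 - x1),
                     200.0 * (x2 - x1**2)])

# Continuous steepest descent path: x'(t) = -grad f(x(t)), followed with an
# adaptive Runge-Kutta integrator. The paper additionally wraps this
# trajectory following in trust-region ideas (not shown here).
sol = solve_ivp(lambda t, x: -rosen_grad(x), t_span=(0.0, 50.0),
                y0=np.array([-1.2, 1.0]), rtol=1e-8, atol=1e-10)
print(sol.y[:, -1])  # approaches the minimizer (1, 1) as t grows
```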


Fine tuning Nesterov's steepest descent algorithm for differentiable convex programming

We modify the first-order algorithm for convex programming described by Nesterov in his book [5]. In his algorithms, Nesterov makes explicit use of a Lipschitz constant L for the function gradient, which is either assumed known [5], or is estimated by an adaptive procedure [7]. We eliminate the use of L at the cost of an extra imprecise line search, and obtain an algorithm which keeps the optim...
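
For reference, here is a minimal sketch of the classical accelerated scheme in the form that uses an explicitly known Lipschitz constant L of the gradient, which is the starting point the abstract describes. The paper's modification, replacing the explicit use of L by an imprecise line search, is not implemented here, and the example problem is illustrative.

```python
import numpy as np

def nesterov_fixed_L(grad_f, x0, L, n_iter=500):
    # Nesterov's accelerated gradient method for smooth convex f, using an
    # explicitly known Lipschitz constant L of grad f: a gradient step of
    # length 1/L from the extrapolated point, followed by a momentum step.
    x = y = np.asarray(x0, dtype=float)
    t = 1.0
    for _ in range(n_iter):
        x_next = y - grad_f(y) / L
        t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)
        x, t = x_next, t_next
    return x

# Example: f(x) = 0.5 * x^T A x with A = diag(1, 100), so L = 100;
# the iterates converge toward the minimizer at the origin.
A = np.diag([1.0, 100.0])
print(nesterov_fixed_L(lambda x: A @ x, x0=np.array([5.0, 5.0]), L=100.0))
```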



Journal

Journal title: Mathematische Nachrichten

Year: 2014

ISSN: 0025-584X

DOI: 10.1002/mana.201300133